Approximation with neural networks activated by ramp sigmoids

Author

  • Gerald H. L. Cheang
Abstract

Accurate and parsimonious approximations for indicator functions of d-dimensional balls and related functions are given using level sets associated with the thresholding of a linear combination of ramp sigmoid activation functions. In neural network terminology, we are using a single-hidden-layer perceptron network implementing the ramp sigmoid activation function to approximate the indicator of a ball. In order to have a relative accuracy ε, we use T = c(d²/ε²) ramp sigmoids, a result comparable to that of Cheang and Barron (2000) [4], where unit step activation functions are used instead. The result is then applied to functions that have variation V_f with respect to a class of ellipsoids. Two-hidden-layer feedforward neural nets with ramp sigmoid activation functions are used to approximate such functions. The approximation error is shown to be bounded by a constant times V_f/T_1^(1/2) + V_f d/T_2^(1/4), where T_1 is the number of nodes in the outer layer and T_2 is the number of nodes in the inner layer of the approximation f_{T_1,T_2}. © 2010 Elsevier Inc. All rights reserved.
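To make the construction concrete, the following is a minimal one-dimensional sketch of the idea in the abstract: a ramp sigmoid (a clipped linear unit), and the indicator of an interval approximated by a difference of two shifted ramps. The function names, the interval [-1, 1], and the sharpness parameter `delta` are illustrative choices, not the paper's actual construction, which combines and thresholds T = c(d²/ε²) such units to approximate the indicator of a d-dimensional ball.

```python
import numpy as np

def ramp(x):
    """Ramp sigmoid: 0 for x <= 0, linear on [0, 1], saturating at 1."""
    return np.clip(x, 0.0, 1.0)

def interval_indicator(x, delta=0.1):
    """Approximate the indicator of [-1, 1] by a difference of two ramp
    sigmoids. Smaller delta sharpens the transition at the endpoints."""
    return ramp((x + 1.0) / delta) - ramp((x - 1.0) / delta)

# Inside the interval the two ramps differ by 1; far outside they cancel.
print(interval_indicator(0.0))   # inside
print(interval_indicator(5.0))   # outside
print(interval_indicator(-5.0))  # outside
```

In higher dimensions a single linear combination of ramps can only resolve a half-space or slab; the paper's contribution is showing how few such units suffice when their thresholded combination must track the curved boundary of a ball.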


Related articles

Networks with Learned Unit Response Functions

Feedforward networks composed of units which compute a sigmoidal function of a weighted sum of their inputs have been much investigated. We tested the approximation and estimation capabilities of networks using functions more complex than sigmoids. Three classes of functions were tested: polynomials, rational functions, and flexible Fourier series. Unlike sigmoids, these classes can fit non-mon...


Characterization of a Class of Sigmoid Functions with Applications to Neural Networks

We study two classes of sigmoids: the simple sigmoids, defined to be odd, asymptotically bounded, completely monotone functions in one variable, and the hyperbolic sigmoids, a proper subset of simple sigmoids and a natural generalization of the hyperbolic tangent. We obtain a complete characterization for the inverses of hyperbolic sigmoids using Euler's incomplete beta functions, and describe ...


Function Approximation by Polynomial Wavelets Generated from Powers of Sigmoids

Wavelet functions have been successfully used in many problems as the activation function of feedforward neural networks [ZB92], [STK92], [PK93]. In this paper, a family of polynomial wavelets generated from powers of sigmoids is described which provides a robust way for designing neural network architectures. It is shown, through experimentation, that function members of this family can present a...


Taxonomy of Neural Transfer Functions

The choice of transfer functions may strongly influence complexity and performance of neural networks used in classification and approximation tasks. A taxonomy of activation and output functions is proposed, allowing many transfer functions to be generated. Several less-known types of transfer functions and new combinations of activation/output functions are described. Functions parameterize to ch...


STRUCTURAL DAMAGE DETECTION BY MODEL UPDATING METHOD BASED ON CASCADE FEED-FORWARD NEURAL NETWORK AS AN EFFICIENT APPROXIMATION MECHANISM

Vibration-based techniques of structural damage detection using the model updating method are computationally expensive for large-scale structures. In this study, after precisely locating the eventual damage of a structure using a modal strain energy based index (MSEBI), to efficiently reduce the computational cost of model updating during the optimization process of damage severity detection, the M...



Journal title:
  • Journal of Approximation Theory

Volume 162, Issue

Pages -

Publication date: 2010